Homotopy Based Algorithms for ℓ0-Regularized Least-Squares
Authors: C. Soussen, J. Idier, J. Duan, D. Brie
Abstract
Sparse signal restoration is usually formulated as the minimization of a quadratic cost function ‖y − Ax‖², where A is a dictionary and x is an unknown sparse vector. It is well known that imposing an ℓ0 constraint leads to an NP-hard minimization problem. The convex relaxation approach, in which the ℓ0-norm is replaced by the ℓ1-norm, has received considerable attention. Among the many efficient ℓ1 solvers, the homotopy algorithm minimizes ‖y − Ax‖² + λ‖x‖₁ with respect to x for a continuum of λ values. It is inspired by the piecewise regularity of the ℓ1-regularization path, also referred to as the homotopy path. In this paper, we address the minimization problem ‖y − Ax‖² + λ‖x‖₀ for a continuum of λ values and propose two heuristic search algorithms for ℓ0-homotopy. Continuation Single Best Replacement (CSBR) is a forward-backward greedy strategy extending the Single Best Replacement (SBR) algorithm, previously proposed for ℓ0-minimization at a given λ; its adaptive search of the λ values is inspired by ℓ1-homotopy. ℓ0 Regularization Path Descent is a more elaborate algorithm exploiting the structural properties of the ℓ0-regularization path, which is piecewise constant with respect to λ. Both algorithms are empirically evaluated on difficult inverse problems involving ill-conditioned dictionaries. Finally, we show that they can easily be coupled with the usual methods of model order selection.
This work was carried out in part while C. Soussen was visiting IRCCyN during the academic year 2010–2011, with the financial support of CNRS. C. Soussen and D. Brie are with the Université de Lorraine and CNRS at the Centre de Recherche en Automatique de Nancy (UMR 7039), Campus Sciences, B.P. 70239, F-54506 Vandœuvre-lès-Nancy, France. Tel: (+33)-3 83 59 56 43, Fax: (+33)-3 83 68 44 62. E-mail: [email protected], [email protected]. J. Idier is with L'UNAM Université, École Centrale Nantes and CNRS at the Institut de Recherche en Communications et Cybernétique de Nantes (UMR 6597), 1 rue de la Noë, BP 92101, F-44321 Nantes Cedex 3, France. Tel: (+33)-2 40 37 69 09, Fax: (+33)-2 40 37 69 30. E-mail: [email protected]. J. Duan was with CRAN. He is now with the Department of Biomedical Engineering, Xi'an Jiaotong University, No. 28, Xianning West Road, Xi'an 710049, Shaanxi Province, China. Tel: (+86)-29-82 66 86 68, Fax: (+86)-29-82 66 76 67. E-mail: [email protected]. Draft of March 18, 2015.
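The Single Best Replacement strategy that CSBR extends can be illustrated with a minimal sketch. This is a hedged, generic implementation assumed from the abstract's description (toggle the single support index that most decreases ‖y − Ax‖² + λ‖x‖₀), not the authors' code: the function name `sbr`, the least-squares amplitude recovery, and the improvement tolerance are our own choices.

```python
import numpy as np

def sbr(y, A, lam, max_iter=50):
    """Forward-backward greedy search for min_x ||y - A x||^2 + lam*||x||_0.
    Each iteration tries every single insertion into / removal from the
    current support and accepts the single best replacement (lowest cost),
    stopping when no move improves the cost."""
    n = A.shape[1]
    support = set()

    def cost(S):
        # least-squares residual restricted to support S, plus the l0 penalty
        if not S:
            return float(y @ y)
        As = A[:, sorted(S)]
        xs, *_ = np.linalg.lstsq(As, y, rcond=None)
        r = y - As @ xs
        return float(r @ r) + lam * len(S)

    best = cost(support)
    for _ in range(max_iter):
        move, move_cost = None, best
        for j in range(n):
            c = cost(support ^ {j})  # toggle j: insert if absent, remove if present
            if c < move_cost - 1e-12:
                move, move_cost = j, c
        if move is None:             # no single replacement improves the cost
            break
        support ^= {move}
        best = move_cost

    # recover the amplitudes on the final support
    x = np.zeros(n)
    if support:
        idx = sorted(support)
        x[idx], *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
    return x, best

# usage: noiseless toy problem with a two-atom sparse vector
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8))
x_true = np.zeros(8)
x_true[1], x_true[5] = 2.0, -3.0
y = A @ x_true
x_hat, final_cost = sbr(y, A, lam=0.1)
```

The backward (removal) moves are what distinguish this family from purely forward greedy pursuits: an atom selected early can later be discarded once it stops paying for its λ penalty.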
Similar resources
Group Sparse Recovery via the ℓ0(ℓ2) Penalty: Theory and Algorithm
In this work we propose and analyze a novel approach for recovering group sparse signals, which arise naturally in a number of practical applications. It is based on regularized least squares with an ℓ0(ℓ2) penalty. One distinct feature of the new approach is that it has a built-in decorrelation mechanism within each group, and thus can handle challenging strong inner-group correlation. We ...
Distributed Iterative Thresholding for ℓ0/ℓ1-Regularized Linear Inverse Problems
The ℓ0/ℓ1-regularized least squares approach is used to deal with linear inverse problems under sparsity constraints, which arise in mathematical and engineering fields, e.g., statistics, signal processing, machine learning, and coding theory. In particular, multi-agent models have recently emerged in this context to describe diverse kinds of networked systems, ranging from medical databas...
متن کاملAlternating direction algorithms for ℓ0 regularization in compressed sensing
In this paper we propose three iterative greedy algorithms for compressed sensing, called iterative alternating direction (IAD), normalized iterative alternating direction (NIAD) and alternating direction pursuit (ADP), which stem from the iteration steps of the alternating direction method of multipliers (ADMM) for ℓ0-regularized least squares (ℓ0-LS) and can be considered as the alternating direct...
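The alternating-direction idea behind such methods can be sketched with a plain ADMM splitting for the ℓ0-LS problem; this is a hedged, generic illustration (not the IAD/NIAD/ADP variants above): the x-update is a ridge-regularized least-squares solve, and the z-update is the proximal operator of λ‖·‖₀, i.e. hard thresholding. The function name `admm_l0` and the fixed penalty `rho` are our own choices.

```python
import numpy as np

def admm_l0(y, A, lam, rho=1.0, n_iter=200):
    """Generic ADMM splitting for min_x ||y - A x||^2 + lam*||x||_0
    with the consensus constraint x = z. The x-update solves a ridge
    system; the z-update hard-thresholds at sqrt(2*lam/rho)."""
    n = A.shape[1]
    G = np.linalg.inv(2.0 * A.T @ A + rho * np.eye(n))  # cached ridge inverse
    Aty2 = 2.0 * A.T @ y
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    thr2 = 2.0 * lam / rho
    for _ in range(n_iter):
        x = G @ (Aty2 + rho * (z - u))      # argmin ||y-Ax||^2 + (rho/2)||x-z+u||^2
        v = x + u
        z = np.where(v * v > thr2, v, 0.0)  # prox of lam*||.||_0: hard threshold
        u = u + x - z
    return z

# usage with A = identity: the fixed point is coordinate-wise hard
# thresholding of y, keeping only the entries with |y_i| > sqrt(lam)
z = admm_l0(np.array([3.0, 0.1, 0.0, 2.0, 0.05]), np.eye(5), lam=0.5)
```

Because the ℓ0 penalty is nonconvex, this splitting carries no general convergence guarantee; on well-conditioned problems it behaves like an iterative hard-thresholding scheme.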
Applications of regularized least squares to pattern classification
We survey a number of recent results concerning the behaviour of algorithms for learning classifiers based on the solution of a regularized least-squares problem. © 2007 Elsevier B.V. All rights reserved.
A Continuous Exact ℓ0 Penalty (CEL0) for Least Squares Regularized Problem
Lemma 4.4 in [E. Soubies, L. Blanc-Féraud and G. Aubert, SIAM J. Imaging Sci., 8 (2015), pp. 1607–1639] is wrong for local minimizers of the continuous exact ℓ0 (CEL0) functional. The argument used to conclude the proof of this lemma is not sufficient in the case of local minimizers. In this note, we supply a revision of this lemma where new results are established for local minimizers. Theorem...
Optimal Rates of Sketched-regularized Algorithms for Least-Squares Regression over Hilbert Spaces
We investigate regularized algorithms combined with projection for the least-squares regression problem over a Hilbert space, covering nonparametric regression over a reproducing kernel Hilbert space. We prove convergence results with respect to variants of norms, under a capacity assumption on the hypothesis space and a regularity condition on the target function. As a result, we obtain optimal r...
Journal: IEEE Trans. Signal Processing
Volume: 63, Issue: -
Pages: -
Year: 2015